Convergence of Gradient-Based Block Coordinate Descent Algorithms for Nonorthogonal Joint Approximate Diagonalization of Matrices

Authors

Abstract

We propose a gradient-based block coordinate descent (BCD-G) framework to solve the joint approximate diagonalization of matrices defined on the product of the complex Stiefel manifold and the special linear group. Instead of the cyclic fashion, we choose the block to update based on the Riemannian gradient. To update the first variable, in the Stiefel manifold, we use a well-known line search method. For the second variable, in the special linear group, we use four different kinds of elementary transformations to construct three classes of special transformations, GLU, GQU and GU, and then get three BCD-G algorithms, BCD-GLU, BCD-GQU and BCD-GU. We establish the global convergence and weak convergence of these algorithms using the Łojasiewicz gradient inequality, under the assumption that the iterates are bounded. We also consider a Jacobi-type framework. As a special case of the GLU and GQU classes, we focus on the Jacobi-GLU and Jacobi-GQU algorithms and establish their convergence as well. All results described in this paper also apply to the real case.
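As a hedged illustration of the underlying problem (not the authors' algorithm), joint approximate diagonalization seeks a transformation $A$ that makes every $A^* C_k A$ as diagonal as possible; a minimal off-diagonal-energy objective can be sketched as follows, where the function name and the orthogonal test setup are illustrative assumptions:

```python
import numpy as np

def off_diagonal_cost(A, Cs):
    """Sum of squared off-diagonal entries of A^H C_k A over all k.

    Illustrative objective only: the paper optimizes over the product
    of the complex Stiefel manifold and the special linear group, with
    updates chosen by the Riemannian gradient.
    """
    cost = 0.0
    for C in Cs:
        D = A.conj().T @ C @ A
        cost += np.sum(np.abs(D) ** 2) - np.sum(np.abs(np.diag(D)) ** 2)
    return cost

# If the C_k share an exact common eigenbasis Q, the cost at Q vanishes.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
Cs = [Q @ np.diag(rng.standard_normal(4)) @ Q.T for _ in range(3)]
print(off_diagonal_cost(Q, Cs))  # ≈ 0 up to rounding
```

In the approximate (noisy) setting no exact common eigenbasis exists, and algorithms such as BCD-G drive this kind of cost down iteratively.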



Related Papers

Convergence of a Block Coordinate Descent Method for Nondifferentiable Minimization

We study the convergence properties of a (block) coordinate descent method applied to minimize a nondifferentiable (nonconvex) function f (x1 , . . . , xN ) with certain separability and regularity properties. Assuming that f is continuous on a compact level set, the subsequence convergence of the iterates to a stationary point is shown when either f is pseudoconvex in every pair of coordinate ...
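As a hedged sketch of the method's structure (not this paper's setting, which allows nondifferentiable f under separability and regularity assumptions), cyclic block coordinate descent updates one block of variables at a time while holding the others fixed:

```python
# Minimal cyclic block coordinate descent on a smooth coupled quadratic
# f(x, y) = x^2 + y^2 + x*y - x - y (illustrative example only).
def f(x, y):
    return x**2 + y**2 + x*y - x - y

x, y = 5.0, -3.0
step = 0.1
for _ in range(300):
    x -= step * (2*x + y - 1)  # gradient step in block 1, y held fixed
    y -= step * (2*y + x - 1)  # gradient step in block 2, x held fixed
# the iterates approach the stationary point x = y = 1/3
```

The convergence results above concern exactly this kind of scheme: without suitable assumptions on f, alternating block updates can stall at non-stationary points.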


Gradient-based joint block diagonalization algorithms: Application to blind separation of FIR convolutive mixtures

This article addresses the problem of the non-unitary joint block diagonalization of a given set of complex matrices. Two new algorithms are provided: the first is based on a classical gradient approach and the second is based on a relative gradient approach. For each algorithm, two versions are provided: the fixed stepsize and the optimal stepsize version. Computer simulations are provided to ...


Convergence Analysis of Gradient Descent Stochastic Algorithms

This paper proves convergence of a sample-path based stochastic gradient-descent algorithm for optimizing expected-value performance measures in discrete event systems. The algorithm uses increasing precision at successive iterations, and it moves against the direction of a generalized gradient of the computed sample performance function. Two convergence results are established: one, for the ca...


A Bayesian Approach to Approximate Joint Diagonalization of Square Matrices

We present a Bayesian scheme for the approximate diagonalisation of several square matrices which are not necessarily symmetric. A Gibbs sampler is derived to simulate samples of the common eigenvectors and the eigenvalues for these matrices. Several synthetic examples are used to illustrate the performance of the proposed Gibbs sampler and we then provide comparisons to several other joint dia...


Joint Approximate Diagonalization of Positive Definite Hermitian Matrices

This paper provides an iterative algorithm to jointly approximately diagonalize $K$ Hermitian positive definite matrices $C_1, \dots, C_K$. Specifically, it calculates the matrix $B$ which minimizes the criterion $\sum_{k=1}^{K} n_k \left[\log \det \mathrm{diag}(B C_k B^*) - \log \det(B C_k B^*)\right]$, the $n_k$ being positive numbers, which is a measure of the deviation from diagonality of the matrices $B C_k B^*$. The convergence of the algorithm...
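As a hedged sketch (the function name and default weights are illustrative assumptions), the deviation-from-diagonality criterion above can be evaluated directly; by Hadamard's inequality it is nonnegative and vanishes exactly when every $B C_k B^*$ is diagonal:

```python
import numpy as np

def pham_criterion(B, Cs, n=None):
    """sum_k n_k [log det diag(B C_k B^*) - log det(B C_k B^*)],
    for Hermitian positive definite matrices C_k and weights n_k > 0.
    Nonnegative; zero iff every B C_k B^* is diagonal."""
    if n is None:
        n = np.ones(len(Cs))  # illustrative default: unit weights
    total = 0.0
    for nk, C in zip(n, Cs):
        M = B @ C @ B.conj().T
        # diagonal of a Hermitian PD congruence is real and positive
        total += nk * (np.sum(np.log(np.real(np.diag(M))))
                       - np.linalg.slogdet(M)[1])
    return total

C = np.array([[2.0, 1.0], [1.0, 2.0]])
print(pham_criterion(np.eye(2), [C]))  # log 4 - log 3 ≈ 0.2877
```

Minimizing this criterion over $B$ is what the cited iterative algorithm does; for already-diagonal $C_k$ and $B = I$ the value is zero.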



Journal

Journal title: SIAM Journal on Matrix Analysis and Applications

Year: 2023

ISSN: 1095-7162, 0895-4798

DOI: https://doi.org/10.1137/21m1456972